Optimizing Multi-GPU Parallelization Strategies for Deep Learning Training
Authors
Abstract
Similar articles
Distributed training strategies for a computer vision deep learning algorithm on a distributed GPU cluster
Deep learning algorithms base their success on building high learning capacity models with millions of parameters that are tuned in a data-driven fashion. These models are trained by processing millions of examples, so that the development of more accurate algorithms is usually limited by the throughput of the computing devices on which they are trained. In this work, we explore how the trainin...
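As a rough illustration of the kind of strategy such work studies, the sketch below shows synchronous data-parallel training of a vision model across GPU workers using PyTorch's DistributedDataParallel; the model, dataset, and hyperparameters are placeholders and are not taken from the cited paper.

```python
# Hedged sketch: synchronous data-parallel training, one process per GPU.
# Launch with e.g. `torchrun --nproc_per_node=<num_gpus> train.py`.
import os
import torch
import torch.distributed as dist
import torchvision
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler

def train():
    # Rank and world size are provided by the launcher.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ.get("LOCAL_RANK", 0))
    torch.cuda.set_device(local_rank)

    model = torchvision.models.resnet50().cuda()
    model = DDP(model, device_ids=[local_rank])   # gradients all-reduced across workers
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
    loss_fn = torch.nn.CrossEntropyLoss()

    # Placeholder dataset; a real run would use an ImageNet-style dataset.
    dataset = torchvision.datasets.FakeData(size=512,
                                            transform=torchvision.transforms.ToTensor())
    sampler = DistributedSampler(dataset)         # each worker sees a disjoint shard
    loader = DataLoader(dataset, batch_size=32, sampler=sampler)

    for epoch in range(2):
        sampler.set_epoch(epoch)                  # reshuffle the shards every epoch
        for images, labels in loader:
            images, labels = images.cuda(), labels.cuda()
            optimizer.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()                       # DDP overlaps the all-reduce with backprop
            optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    train()
```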
Multi-GPU and Multi-CPU Parallelization for Interactive Physics Simulations
Today, it is possible to combine multiple CPUs and multiple GPUs in a single shared-memory architecture. Using these resources efficiently and seamlessly is a challenging issue. In this paper, we propose a parallelization scheme for dynamically balancing the workload between multiple CPUs and GPUs. Most tasks have both a CPU and a GPU implementation, so they can be executed on any processing unit. W...
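A much-simplified, hedged sketch of dynamic load balancing in this spirit (not the paper's actual scheduler): a shared queue of tasks is drained by one worker per processing unit, each calling either a CPU or a GPU implementation, so faster units naturally pick up more work. The task bodies below are stand-ins.

```python
# Illustrative only: greedy work-stealing from a shared queue,
# with dummy CPU/GPU task implementations simulated by sleeps.
import queue
import threading
import time

tasks = queue.Queue()
for i in range(20):
    tasks.put(i)

def run_on_cpu(task):
    time.sleep(0.05)    # stand-in for the CPU implementation of the task

def run_on_gpu(task, gpu_id):
    time.sleep(0.01)    # stand-in for the (faster) GPU implementation

def worker(name, run):
    # Each processing unit pulls the next task as soon as it is free.
    while True:
        try:
            task = tasks.get_nowait()
        except queue.Empty:
            return
        run(task)
        print(f"{name} finished task {task}")

units = [threading.Thread(target=worker, args=("cpu-0", run_on_cpu)),
         threading.Thread(target=worker, args=("cpu-1", run_on_cpu)),
         threading.Thread(target=worker, args=("gpu-0", lambda t: run_on_gpu(t, 0)))]
for u in units:
    u.start()
for u in units:
    u.join()
```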
Parallelization Strategies for Hybrid Metaheuristics Using a Single GPU and Multi-core Resources
Hybrid metaheuristics are powerful methods for solving complex problems in science and industry. Nevertheless, the resolution time remains prohibitive when dealing with large problem instances. As a result, GPU computing has been recognized as a major way to speed up the search process. However, most GPU-accelerated algorithms in the literature do not take advantage of all the availabl...
Multi-GPU Training of ConvNets
In this work, we consider a standard architecture [1] trained on the Imagenet dataset [2] for classification and investigate methods to speed up convergence by parallelizing training across multiple GPUs. We used up to 4 NVIDIA TITAN GPUs with 6 GB of RAM. While our experiments are performed on a single server, our GPUs have disjoint memory spaces, and just as in the distributed setti...
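A minimal single-process sketch of the synchronous data-parallel pattern this abstract alludes to, assuming one model replica per GPU with disjoint memory and explicit gradient averaging after each backward pass; the toy model, batch sizes, and learning rate are illustrative only.

```python
# Hedged sketch: per-GPU replicas plus manual gradient averaging,
# i.e. the communication an all-reduce would perform in the distributed setting.
import torch
import torch.nn as nn

if torch.cuda.is_available():
    devices = [torch.device(f"cuda:{i}") for i in range(min(torch.cuda.device_count(), 4))]
else:
    devices = [torch.device("cpu")] * 2   # fallback so the sketch still runs without GPUs

# Identical replicas on devices with disjoint memory spaces.
replicas = [nn.Linear(1024, 10).to(d) for d in devices]
for r in replicas[1:]:
    r.load_state_dict(replicas[0].state_dict())
optims = [torch.optim.SGD(r.parameters(), lr=0.01) for r in replicas]
loss_fn = nn.CrossEntropyLoss()

for step in range(10):
    # Each replica processes its own shard of the global mini-batch.
    for r, opt, d in zip(replicas, optims, devices):
        x = torch.randn(32, 1024, device=d)
        y = torch.randint(0, 10, (32,), device=d)
        opt.zero_grad()
        loss_fn(r(x), y).backward()

    # Average gradients across replicas so all copies stay synchronized.
    for params in zip(*(r.parameters() for r in replicas)):
        mean_grad = torch.stack([p.grad.to(devices[0]) for p in params]).mean(dim=0)
        for p in params:
            p.grad.copy_(mean_grad.to(p.device))

    for opt in optims:
        opt.step()
```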
Journal
Journal title: IEEE Micro
Year: 2019
ISSN: 0272-1732, 1937-4143
DOI: 10.1109/mm.2019.2935967